Topics on Reduced Rank Methods for Multivariate Regression

Authors

  • Ashin Mukherjee
  • Jie Cheng
Abstract

Topics on Reduced Rank Methods for Multivariate Regression, by Ashin Mukherjee. Advisors: Professor Ji Zhu and Professor Naisyin Wang.

Multivariate regression problems are a simple generalization of the univariate regression problem to the situation where we want to predict q (> 1) responses that depend on the same set of features or predictors. Problems of this type are encountered commonly in many quantitative fields. The main goal is to build more accurate and interpretable models that can exploit the dependence structure among the responses and achieve dimension reduction. Reduced rank regression has been an important tool to this end due to its simplicity, computational efficiency, and predictive performance that is often superior to that of far more complex models. The first two parts of this thesis investigate important practical aspects of the reduced rank regression method, such as handling collinearity in the design matrix and selecting the optimal rank. The last part focuses on extensions of reduced rank methods to general functional models.

Chapter 2 emphasizes that the usual reduced rank regression is vulnerable to high collinearity among the predictor variables, as collinearity can seriously distort the singular structure of the signal matrix. To address this, we propose the reduced rank ridge regression method, which adds a ridge penalty to the low rank constraint on the coefficient matrix. The ridge penalty introduces shrinkage, which allows us to avoid singularities when predictors are collinear. We develop a straightforward computational algorithm to solve the resulting optimization problem, and we also discuss a novel extension of the reduced rank methodology to the Reproducing Kernel Hilbert Space (RKHS) setting.
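To make the two-step structure of such an estimator concrete, here is a minimal NumPy sketch of a reduced rank ridge fit in the spirit described above: a ridge solution followed by projection of the (ridge-augmented) fitted values onto their leading singular subspace. The function name reduced_rank_ridge and its interface are illustrative choices, not code from the thesis.

    import numpy as np

    def reduced_rank_ridge(X, Y, rank, lam):
        # Ridge solution: (X'X + lam * I)^{-1} X'Y
        p = X.shape[1]
        B_ridge = np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ Y)
        # Ridge equals least squares on data augmented with sqrt(lam) * I rows,
        # so the low-rank projection acts on the augmented fitted values.
        X_aug = np.vstack([X, np.sqrt(lam) * np.eye(p)])
        _, _, Vt = np.linalg.svd(X_aug @ B_ridge, full_matrices=False)
        P = Vt[:rank].T               # q x rank projection basis
        return B_ridge @ P @ P.T      # p x q coefficient matrix, rank <= rank

Shrinking first and truncating the singular structure afterwards is what lets such an estimator tolerate collinear predictors: the ridge term keeps X'X + lam * I invertible even when X'X is singular.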

Similar Resources

Simple Tests for Reduced Rank in Multivariate Regression

The present work proposes tests for reduced rank in multivariate regression coefficient matrices, under rather general conditions. A heuristic approach is to first estimate the regressions via standard methods, then compare the coefficient matrix rows (or columns) to assess their redundancy. A formal version of this approach utilizes the distance between an unrestricted coefficient matrix estim...
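The statistic itself is cut off above, but the heuristic it formalizes is easy to illustrate: compare the unrestricted least squares coefficient matrix with its best rank-r approximation. The sketch below is an illustrative diagnostic in our own notation, not the formal test from the paper.

    import numpy as np

    def rank_r_distance(X, Y, r):
        # Unrestricted p x q least squares estimate
        B, *_ = np.linalg.lstsq(X, Y, rcond=None)
        # Best rank-r approximation via truncated SVD (Eckart-Young)
        U, s, Vt = np.linalg.svd(B, full_matrices=False)
        B_r = (U[:, :r] * s[:r]) @ Vt[:r]
        # A small distance suggests that rank r is adequate
        return np.linalg.norm(B - B_r)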

Reduced rank ridge regression and its kernel extensions

In multivariate linear regression, it is often assumed that the response matrix is intrinsically of lower rank. This could be because of the correlation structure among the predictor variables or because the coefficient matrix is of low rank. To accommodate both, we propose a reduced rank ridge regression for multivariate linear regression. Specifically, we combine the ridge penalty with the reduced...

Sparse Reduced-Rank Regression for Simultaneous Dimension Reduction and Variable Selection in Multivariate Regression

Reduced-rank regression is an effective method for predicting multiple response variables from the same set of predictor variables: it reduces the number of model parameters and takes advantage of interrelations between the response variables, thereby improving predictive accuracy. We propose to add a new feature to reduced-rank regression that allows selection of releva...
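The sentence above is truncated, but the usual formulation of such a method (for example, Chen and Huang's sparse reduced-rank regression) factors the coefficient matrix and places a row-wise group penalty on the predictor-side factor. The display below is a hedged reconstruction in our own notation, not necessarily the paper's exact objective:

    \min_{A \in \mathbb{R}^{p \times r},\; B \in \mathbb{R}^{q \times r}}
    \; \| Y - X A B^\top \|_F^2 \;+\; \lambda \sum_{j=1}^{p} \| a_j \|_2,
    \qquad \text{subject to } B^\top B = I_r,

where a_j is the j-th row of A. Driving a_j to zero removes predictor j from all q response models at once, which is what yields simultaneous dimension reduction and variable selection.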

Biplots in Reduced-Rank Regression

Regression problems with a number of related response variables are typically analyzed by separate multiple regressions. This paper shows how these regressions can be visualized jointly in a biplot based on reduced-rank regression. Reduced-rank regression combines multiple regression and principal components analysis and can therefore be carried out with standard statistical packages....
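Because reduced-rank regression is multiple regression followed by a principal component style decomposition of the fitted values, the biplot coordinates take only a few lines of standard numerical code. This rank-2 sketch uses our own naming and is not the paper's software:

    import numpy as np

    def rrr_biplot_coords(X, Y, r=2):
        # Multiple regression step: unrestricted OLS fit
        B_ols, *_ = np.linalg.lstsq(X, Y, rcond=None)
        # Principal component step: SVD of the fitted values
        U, s, Vt = np.linalg.svd(X @ B_ols, full_matrices=False)
        scores = U[:, :r] * s[:r]     # n x r coordinates for the observations
        loadings = Vt[:r].T           # q x r coordinates for the responses
        return scores, loadings

Plotting the rows of scores and loadings on the same axes gives the joint biplot; inner products between them approximate the rank-r fitted values.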

Nonparametric Reduced Rank Regression

We propose an approach to multivariate nonparametric regression that generalizes reduced rank regression for linear models. An additive model is estimated for each dimension of a q-dimensional response, with a shared p-dimensional predictor variable. To control the complexity of the model, we employ a functional form of the Ky-Fan or nuclear norm, resulting in a set of function estimates that h...
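For intuition about how a nuclear (Ky Fan) norm penalty induces low rank, consider the linear analogue: proximal gradient descent in which each step soft-thresholds the singular values of the coefficient matrix. The functional version in the paper generalizes this idea; the sketch below, including the names nuclear_prox and nuclear_norm_regression and the step-size choice, is our own simplification:

    import numpy as np

    def nuclear_prox(B, t):
        # Prox of t * nuclear norm: soft-threshold the singular values of B
        U, s, Vt = np.linalg.svd(B, full_matrices=False)
        return (U * np.maximum(s - t, 0.0)) @ Vt

    def nuclear_norm_regression(X, Y, lam, iters=500):
        n, p = X.shape
        B = np.zeros((p, Y.shape[1]))
        step = n / np.linalg.norm(X, 2) ** 2      # 1 / Lipschitz constant
        for _ in range(iters):
            grad = X.T @ (X @ B - Y) / n          # grad of (1/2n)||Y - XB||_F^2
            B = nuclear_prox(B - step * grad, step * lam)
        return B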

Publication date: 2013